Documentation update: add readthedocs with sphinx #84
base: main
Conversation
…enerated docs, inference and models doesn't show up. Need to investigate this issue further
…still needs to be completed
…restructure proposed in recent issue.
… contributing.md as updated material is now in the readthedocs
689a9f2 to a33b1e0
…cs finds it and generates it
…exclude test files from autodocs
…opposed to in the __init__)
…Improve details of submitting PRs. Add information for reviewers on how to merge pull requests.
Thanks so much Alex, this is awesome!! Just some minor comments, mostly just to make sure all the documentation is up to date with the recent restructuring PRs and slight changes to installation setup etc.
I think we can continue to edit the wording in the documentation as we go but no need for any major changes in this PR.
- Refer to the [API Reference](https://sparsecoding.readthedocs.io/en/latest/api.html) for detailed usage of the library's features.

## Setup
should match whatever changes are made in #91
4. Navigate into the repo and install the requirements using `pip install -r requirements.txt`
5. Install the natural images dataset from this link: https://rctn.org/bruno/sparsenet/IMAGES.mat
6. Try running the demo notebook: `examples/sparse_coding.ipynb`

Historically, sparse coding has been largely focused on learning sparse representations of images and we provide visualization and transformation tools to work with such data. However, we’ve tried to structure the transformation, dictionary learning methods, and inference methods in a manner that is data-agnostic, making them applicable to a wide range of use cases.
nits:
- add comma after "representations of images"
- remove the "we've tried" -> "However, we structure the transformations, dictionary..."
We believe that sharing code within the scientific community is an important part of science and we hope that the research community finds this library useful.
i think we can remove this paragraph?
regardless i think we'll edit wording as we go so not too big of a deal for now
If you’ve identified a new feature to add or a bug you can fix, follow these steps:

#. Clone the ``main`` branch.
sorry this wouldn't have been an issue if i were more prompt in reviewing but will need to update with changes from #89
inference, and data processing. It was written to be a useful research tool
for applying various sparse coding methods to data.

We believe that sharing code within the scientific community is an important
delete?
@@ -0,0 +1,20 @@
============
Installation
same as readme, update with #91
Parameters
----------
sparsity : scalar (1,)
is there a reason to show the size for scalar and string (line 20)? if so, should do consistently across docstrings. if not, can remove (can do this in a separate pr if it's too much for this one)
n_iter : scalar (1,) default=100
    number of iterations to run for an inference method
return_all_coefficients : string (1,) default=False
    returns all coefficients during inference procedure if True
i can see this being slightly confusing (which dimension does "all" refer to), maybe change wording to something like "returns coefficients for all inference steps"
and also do for all methods with this arg
btw there are some inconsistencies with this arg across inference methods so i opened another issue i can fix separately #93
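To make the suggestion concrete, here is one way the entry could read after both changes (dropping the `(1,)` shape annotation and clarifying which dimension "all" refers to). The function name and signature below are placeholders for illustration only, not the library's actual API, and `bool` is an assumption based on the `False` default:

```python
def infer(data, dictionary, n_iter=100, return_all_coefficients=False):
    """Placeholder signature used only to illustrate the suggested docstring wording.

    Parameters
    ----------
    n_iter : int, default=100
        Number of iterations to run the inference method for.
    return_all_coefficients : bool, default=False
        If True, return the coefficients from all inference steps
        rather than only the final coefficients.
    """
    raise NotImplementedError  # illustration only
```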
I've added sphinx support with autodoc and autosummary, which utilizes inline code blocks to generate documentation. In addition to this, I've added three additional pages that provide background info:

- `docs/install.rst`: goes through the installation procedure
- `docs/quickstart.rst`: provides high-level info regarding repo structure, and pointers to notebooks
- `docs/contributing.rst`: info regarding how to contribute (largely from `docs/contributing.md`, which is now deleted, and includes some updates)

Updated the readme to provide pointers to these new resources and added some additional information about the repo.
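For anyone setting this up locally, the configuration that drives autodoc/autosummary lives in a `conf.py` under `docs/`. The sketch below shows the general shape of such a setup; the exact extension list, paths, and glob patterns are assumptions and may differ from what this PR actually contains:

```python
# docs/conf.py -- a minimal sketch, not the actual file from this PR.
import os
import sys

# Make the package importable so autodoc can introspect docstrings
# (the relative path is an assumption about the repo layout).
sys.path.insert(0, os.path.abspath(".."))

project = "sparsecoding"

extensions = [
    "sphinx.ext.autodoc",      # pull API documentation from docstrings
    "sphinx.ext.autosummary",  # generate summary tables and stub pages
    "sphinx.ext.napoleon",     # parse NumPy-style docstrings like the ones quoted above
]

# Generate the autosummary stub pages automatically at build time.
autosummary_generate = True

# Keep build artifacts and, for example, test-module pages out of the rendered
# docs; these glob patterns are assumptions.
exclude_patterns = ["_build", "**/tests*"]
```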
I also updated the `__init__.py` files so that they import library functionalities and tab autocomplete works in IPython.
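The idea is roughly the sketch below; `inference` and `models` are submodules mentioned elsewhere in this PR, and the real `__init__.py` presumably re-exports more than these two:

```python
# sparsecoding/__init__.py -- sketch of re-exporting submodules so that
# `import sparsecoding` makes them visible to IPython tab completion.
from . import inference, models

__all__ = ["inference", "models"]
```

With this in place, typing `sparsecoding.` and hitting TAB in IPython lists the re-exported submodules.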